Online PCA with Spectral Bounds

Author

  • Zohar Karnin
Abstract

This paper revisits the online PCA problem. Given a stream of n vectors x_t ∈ R^d (columns of X), the algorithm must output y_t ∈ R^ℓ (columns of Y) before receiving x_{t+1}. The goal of online PCA is to simultaneously minimize the target dimension ℓ and the error ‖X − (XY⁺)Y‖. We describe two simple and deterministic algorithms. The first receives a parameter Δ and guarantees that ‖X − (XY⁺)Y‖ is not significantly larger than Δ. It requires a target dimension of ℓ = O(k/ε) for any k, ε such that Δ ≥ εσ_1 + σ_{k+1}, where σ_i is the i-th singular value of X. The second receives k and ε and guarantees that ‖X − (XY⁺)Y‖ ≤ εσ_1 + σ_{k+1}. It requires a target dimension of O(k log n/ε). Different models and algorithms for online PCA have been considered in the past; this is the first to achieve a bound on the spectral norm of the residual matrix.
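As a sketch of the error measure only (not the paper's online algorithms, which must commit to y_t before seeing x_{t+1}), the quantity ‖X − (XY⁺)Y‖ can be computed directly in NumPy. Here Y is built offline from the top-ℓ left singular vectors of X, a choice the online setting does not allow; the variable names are illustrative.

```python
import numpy as np

# Illustrative sketch of the online-PCA error measure ||X - (X Y^+) Y||.
# Y is chosen offline from the top-l left singular vectors of X, which
# the online setting does not permit -- this only demonstrates the metric.
rng = np.random.default_rng(0)
d, n, l = 5, 20, 2
X = rng.standard_normal((d, n))          # columns are the stream x_1..x_n

U, s, Vt = np.linalg.svd(X, full_matrices=False)
Y = U[:, :l].T @ X                       # y_t = U_l^T x_t, so Y is l x n

residual = X - (X @ np.linalg.pinv(Y)) @ Y   # X - (X Y^+) Y
err = np.linalg.norm(residual, 2)            # spectral norm of the residual
# For this offline choice, err equals sigma_{l+1}, i.e. s[l].
```

With this offline Y, the residual spectral norm matches the (ℓ+1)-th singular value of X, which is the baseline the paper's bounds are measured against.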


Related articles

Representing Spectral data using LabPQR color space in comparison to PCA method

In many applications of color technology, such as spectral color reproduction, it is of interest to represent spectral data in fewer dimensions than the full spectral space. For more than half a century, Principal Component Analysis (PCA) has been applied to find the number of independent basis vectors of a spectral dataset and to represent spectral reflectance with lower di...
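As a hedged sketch of the basis-vector idea described above (synthetic data, plain NumPy; not the LabPQR transform the paper compares against), PCA compression of reflectance spectra looks like:

```python
import numpy as np

# PCA compression of spectral reflectance curves -- synthetic data,
# illustrating the independent-basis-vector idea only.
rng = np.random.default_rng(1)
n_samples, n_wavelengths, n_basis = 100, 31, 6  # e.g. 400-700 nm in 10 nm steps
R = rng.random((n_samples, n_wavelengths))      # reflectance spectra as rows

mean = R.mean(axis=0)
_, _, Vt = np.linalg.svd(R - mean, full_matrices=False)
B = Vt[:n_basis]                    # leading principal basis vectors
coeffs = (R - mean) @ B.T           # 6 numbers per spectrum instead of 31
R_hat = coeffs @ B + mean           # reconstructed spectra
rmse = np.sqrt(np.mean((R - R_hat) ** 2))
```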


Sharp Bounds on the PI Spectral Radius

In this paper some upper and lower bounds for the greatest eigenvalues of the PI and vertex PI matrices of a graph G are obtained. Those graphs for which these bounds are best possible are characterized.


Online PCA with Optimal Regrets

We carefully investigate the online version of PCA, where in each trial a learning algorithm plays a k-dimensional subspace and suffers the compression loss on the next instance when it is projected into the chosen subspace. In this setting, we give regret bounds for two popular online algorithms, Gradient Descent (GD) and Matrix Exponentiated Gradient (MEG). We show that both algorithms are essenti...
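The compression loss in this setting can be sketched with a generic orthogonal projection (a minimal illustration, not the GD or MEG updates from the paper):

```python
import numpy as np

# Compression loss of playing a k-dimensional subspace on instance x:
# loss = ||x - P x||^2, where P projects orthogonally onto the subspace.
rng = np.random.default_rng(2)
d, k = 6, 2
x = rng.standard_normal(d)
Q, _ = np.linalg.qr(rng.standard_normal((d, k)))  # orthonormal basis, d x k
P = Q @ Q.T                                       # rank-k projection matrix
loss = np.linalg.norm(x - P @ x) ** 2
```

The loss is zero exactly when x already lies in the chosen subspace, and never exceeds ‖x‖², which is what makes regret against the best fixed subspace a meaningful benchmark.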


Spectral Smoothing via Random Matrix Perturbations

We consider stochastic smoothing of spectral functions of matrices using perturbations commonly studied in random matrix theory. We show that a spectral function remains spectral when smoothed using a unitarily invariant perturbation distribution. We then derive state-of-the-art smoothing bounds for the maximum eigenvalue function using the Gaussian Orthogonal Ensemble (GOE). Smoothing the maxi...
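A minimal sketch of the perturbation setup (generic symmetric-Gaussian, GOE-style noise; the normalization convention here is an assumption, not the paper's):

```python
import numpy as np

# Smoothing the maximum-eigenvalue function with a random symmetric
# (GOE-style) perturbation: evaluate lambda_max(A + eps * Z) for Gaussian Z.
rng = np.random.default_rng(3)
d = 4
G = rng.standard_normal((d, d))
Z = (G + G.T) / 2                       # symmetric Gaussian perturbation
A = np.diag([3.0, 1.0, 0.5, -2.0])      # example symmetric matrix
eps = 0.1
lam = np.linalg.eigvalsh(A + eps * Z)[-1]   # perturbed top eigenvalue
# Weyl's inequality: |lam - lambda_max(A)| <= eps * ||Z||_2
```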


Error bounds in approximating n-time differentiable functions of self-adjoint operators in Hilbert spaces via a Taylor's type expansion

Utilizing the spectral representation of self-adjoint operators in Hilbert spaces, some error bounds in approximating $n$-time differentiable functions of self-adjoint operators in Hilbert spaces via a Taylor's type expansion are given.




Publication date: 2015